

Interpolation, Extrapolation, Hyperpolation: Generalising into new dimensions

Ord, Toby

arXiv.org Artificial Intelligence

Interpolation and extrapolation are two kinds of generalisation: ways of applying an idea in a broader domain than we've seen so far. Interpolation asks what lies between the examples we've already seen, while extrapolation asks what lies beyond. They are widely used concepts that find technical application within science and engineering (especially in data science, numerical analysis, machine learning, economics, and computer graphics), where many different mathematical methods for interpolation and extrapolation are used. These concepts are also used in a less formal manner in many other areas, such as philosophy, art, and futurism, where we might make more qualitative interpolations and extrapolations. For instance, we might ask whether an emerging kind of music is mainly an interpolation between two existing genres; or whether affordable land travel, sea travel, and air travel are likely to be followed by affordable space travel. In all these contexts, interpolation and extrapolation are seen as twin concepts; an inseparable pair. I want to suggest that what appear to be twins are in fact two of a set of triplets -- that they have a hitherto unknown sibling.
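The abstract above uses interpolation and extrapolation in their technical sense. As an illustration only (not drawn from the paper), the simplest numerical form of both is the same straight-line formula; the only difference is whether it is evaluated between the known points or beyond them:

```python
def lerp(x0, y0, x1, y1, x):
    """Linear estimate of y at x from two known points (x0, y0) and (x1, y1)."""
    t = (x - x0) / (x1 - x0)
    return y0 + t * (y1 - y0)

# Interpolation: the query point x=2.0 lies between the known x values 1 and 3.
inside = lerp(1.0, 2.0, 3.0, 6.0, 2.0)   # -> 4.0

# Extrapolation: the same line, evaluated past the known range at x=5.0.
beyond = lerp(1.0, 2.0, 3.0, 6.0, 5.0)   # -> 10.0
```

In practice, extrapolation is far less reliable than interpolation: the further `x` lies outside the observed range, the more the linear assumption can mislead.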


AI writing has entered a new dimension, and it's going to change education

#artificialintelligence

What happens when robots not only learn to write well, but the tech becomes easily accessible and cheap? As Hal Crawford explains, it'll likely be teachers who feel the effects first. There are two schools of thought when it comes to artificial intelligence: there are the people who have heard of the GPT-3 language model, and then there are those who have heard about it, gone to the OpenAI site, created a guest login and tried it out for themselves. The first group contains people who are wondering what the big deal is. The second group does not. I haven't heard of anyone who's actually used GPT-3 and doesn't think AI is going to change the world profoundly. Education in particular is going to feel its influence immediately.


Xing

AAAI Conferences

The proposed Perception Evolution Network (PEN) is a biologically inspired neural network model for unsupervised learning and online incremental learning. It automatically learns suitable prototypes from the data in an online incremental way, without requiring a predefined prototype number or similarity threshold. Moreover, unlike existing unsupervised neural network models, PEN permits the emergence of a new dimension of perception in the network's perception field. When a new dimension of perception is introduced, PEN integrates the new dimensional sensory inputs with the learned prototypes; that is, the prototypes are mapped into a higher-dimensional space consisting of both the original and the new dimensions of the sensory inputs. We call this the Cognition Deepening Process. Experiments on artificial and real-world data show that PEN works effectively.
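The two ideas named in the abstract can be made concrete with a toy sketch. The following is not the PEN algorithm itself; it is a minimal illustration of online incremental prototype learning plus lifting learned prototypes into a higher-dimensional space when a new sensory dimension appears. The class name `PrototypeStore`, the methods `observe` and `add_dimension`, the fixed `threshold`, and the merge-by-averaging rule are all invented for this sketch:

```python
import math

class PrototypeStore:
    """Toy sketch (not PEN): online prototypes that can absorb a new dimension."""

    def __init__(self, dim):
        self.dim = dim
        self.prototypes = []

    def observe(self, x, threshold=1.0):
        # Online incremental learning: adopt x as a new prototype if it is far
        # from every existing prototype, otherwise nudge the nearest prototype
        # toward x by averaging.
        if not self.prototypes:
            self.prototypes.append(list(x))
            return
        dists = [math.dist(p, x) for p in self.prototypes]
        i = min(range(len(dists)), key=dists.__getitem__)
        if dists[i] > threshold:
            self.prototypes.append(list(x))
        else:
            self.prototypes[i] = [(a + b) / 2 for a, b in zip(self.prototypes[i], x)]

    def add_dimension(self, default=0.0):
        # "Cognition deepening", loosely: lift every learned prototype into the
        # higher-dimensional space that includes the new sensory channel.
        self.dim += 1
        for p in self.prototypes:
            p.append(default)

store = PrototypeStore(dim=2)
store.observe([0.0, 0.0])
store.observe([5.0, 5.0])   # far away: becomes a second prototype
store.add_dimension()       # every prototype is now 3-dimensional
```

The real PEN additionally handles how the new dimension's values are learned for old prototypes; this sketch only pads them with a default value.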


Artificial Intelligence Will be the Commander of the Future Wars

#artificialintelligence

Artificial intelligence is one of several hot technologies with the potential to transform the face of combat in the coming years. The Department of Defense established the Joint Artificial Intelligence Center to win the artificial intelligence race. In some visions, AI could enable autonomous systems to execute missions, achieve sensor fusion, automate activities, and make better, faster judgments than people. AI is developing quickly, and those objectives may be met soon. Meanwhile, artificial intelligence will take on the more routine, boring, and monotonous duties that military personnel undertake in uncontested situations.


How Big Data Carried Graph Theory Into New Dimensions

WIRED

The mathematical language for talking about connections, which usually depends on networks--vertices (dots) and edges (lines connecting them)--has been an invaluable way to model real-world phenomena since at least the 18th century. But a few decades ago, the emergence of giant data sets forced researchers to expand their toolboxes and, at the same time, gave them sprawling sandboxes in which to apply new mathematical insights. Since then, said Josh Grochow, a computer scientist at the University of Colorado, Boulder, there's been an exciting period of rapid growth as researchers have developed new kinds of network models that can find complex structures and signals in the noise of big data. Original story reprinted with permission from Quanta Magazine, an editorially independent publication of the Simons Foundation whose mission is to enhance public understanding of science by covering research developments and trends in mathematics and the physical and life sciences. Grochow is among a growing chorus of researchers who point out that when it comes to finding connections in big data, graph theory has its limits.


Breaking the Memory Wall for AI Chip with a New Dimension

#artificialintelligence

Recent advancements in deep learning have led to the widespread adoption of artificial intelligence (AI) in applications such as computer vision and natural language processing. As neural networks become deeper and larger, AI modeling demands outstrip the capabilities of conventional chip architectures. Memory bandwidth falls behind processing power. Energy consumption comes to dominate the total cost of ownership. Currently, memory capacity is insufficient to support the most advanced NLP models.


Archaeological search engine adds a new dimension to 'digging'

AIHub

Apps that can precisely identify shards, coins or heel bones: archaeology has embraced artificial intelligence. Alex Brandsen is working on a search engine that scans vast quantities of text from an archaeological viewpoint. An archaeologist by training, he spent time working as a programmer before returning to university to study for a PhD combining the two. "I've noticed at [archaeology] conferences over the last two years that AI has become a real buzzword, and a lot of money and energy are going into it." Brandsen's search engine is designed to let archaeologists quickly and effectively scan all the excavation reports of Dutch finds. "For example, if you search for burial rites in the Middle Ages, the search engine needs to understand that the term 1200 CE is also relevant. There are thousands of terms that mean Middle Ages, and it has to find them all. It must also be able to distinguish between a bill as a bladed weapon and a researcher whose name is Bill."


The Aha! Moments In 4 Popular Machine Learning Algorithms

#artificialintelligence

At each step, the Decision Tree algorithm attempts to build the tree so that entropy is minimized. Think of entropy as a measure of the 'disorder' or 'confusion' a certain divider (the conditions) has, and its opposite as 'information gain' -- how much a divider adds information and insight to the model. Feature splits with the highest information gain (and thus the lowest entropy) are placed at the top. Note that condition 1 has a clean separation, and therefore low entropy and high information gain. The same cannot be said for condition 3, which is why it is placed near the bottom of the Decision Tree.
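The entropy and information-gain bookkeeping described above can be sketched in a few lines. This is a minimal illustration of the standard formulas, not any particular library's implementation:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(parent, splits):
    """Parent entropy minus the size-weighted entropy of the child splits."""
    n = len(parent)
    weighted = sum(len(s) / n * entropy(s) for s in splits)
    return entropy(parent) - weighted

parent = ["yes", "yes", "no", "no"]

# A clean split: each child is pure, so entropy drops to zero and gain is maximal.
clean = information_gain(parent, [["yes", "yes"], ["no", "no"]])   # -> 1.0

# A useless split: each child mirrors the parent's mix, so nothing is gained.
messy = information_gain(parent, [["yes", "no"], ["yes", "no"]])   # -> 0.0
```

A decision-tree learner evaluates every candidate condition this way and places the highest-gain split (here, the "clean" one) nearest the root.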


How AI is bringing a new dimension to software testing - Cloud Computing News

#artificialintelligence

Software testing teams analyse and correct thousands of lines of code daily to ensure the final product is free of errors. However, today's on-demand customer expects software that is comprehensive in functionality and delivered with precision and speed. Current software testing procedures are not scalable to meet these needs, nor are they cost- or time-efficient in the digital economy. As products become more complex to create, the code becomes more challenging to test accurately. Manual testing exposes development teams to many challenges--code changes causing errors elsewhere in the product, the considerable length of regression testing cycles, the resourcing constraints of hiring enough skilled software testers to meet demand, and more.


New Air Force stealth bomber arrives in just '2 years'

FOX News

Fox News Flash top headlines are here. Check out what's clicking on Foxnews.com. The much-anticipated, high-tech B-21 bomber will "come on in two years," bringing new dimensions of stealth, software, attack possibilities and nuclear deterrence to the U.S. Air Force. It would even possibly usher in new tactical approaches to how modern operations may move forward in the years ahead. In a conversation with the Mitchell Institute for Aerospace Studies regarding the importance of modernizing the nuclear triad, Air Force Chief of Staff General Stephen Wilson confirmed that the stealthy new aircraft will "come on in two years."